
    Distinguishing noise from chaos: objective versus subjective criteria using Horizontal Visibility Graph

    A recently proposed methodology called the Horizontal Visibility Graph (HVG) [Luque et al., Phys. Rev. E 80, 046103 (2009)], which constitutes a geometrical simplification of the well-known Visibility Graph algorithm [Lacasa et al., Proc. Natl. Acad. Sci. U.S.A. 105, 4972 (2008)], has been used to study the distinction between deterministic and stochastic components in time series [L. Lacasa and R. Toral, Phys. Rev. E 82, 036120 (2010)]. Specifically, the authors propose that the node degree distribution of these processes follows an exponential of the form P(Îș) ~ exp(−λÎș), in which Îș is the node degree and λ is a positive parameter able to distinguish between deterministic (chaotic) and stochastic (uncorrelated and correlated) dynamics. In this work, we investigate the characteristics of the node degree distributions constructed by using the HVG for time series corresponding to 28 chaotic maps and 3 different stochastic processes. We thoroughly study the methodology proposed by Lacasa and Toral, finding several cases for which their hypothesis is not valid. We propose a methodology that uses the HVG together with Information Theory quantifiers. An extensive and careful analysis of the node degree distributions obtained by applying the HVG allows us to conclude that the Fisher-Shannon information plane is a remarkable tool able to graphically represent the different nature, deterministic or stochastic, of the systems under study. Comment: Submitted to PLOS ONE
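    The HVG construction this abstract builds on is simple enough to sketch: two points of the series are linked if every intermediate value lies strictly below both. A minimal O(nÂČ) Python illustration (the function name is ours; the paper does not prescribe an implementation):

    ```python
    import numpy as np

    def hvg_degrees(x):
        """Node degrees of the horizontal visibility graph of a series x.

        Points i < j are linked iff x[k] < min(x[i], x[j]) for all i < k < j.
        """
        n = len(x)
        deg = np.zeros(n, dtype=int)
        for i in range(n):
            for j in range(i + 1, n):
                if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                    deg[i] += 1  # each horizontal line of sight adds one
                    deg[j] += 1  # edge, counted at both endpoints
        return deg
    ```

    The empirical P(Îș) is then just the normalized histogram of these degrees, whose exponential decay rate λ is the quantity the abstract discusses.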

    Outlier mining in high-dimensional data using the Jensen-Shannon divergence and graph structure analysis

    Reliable anomaly/outlier detection algorithms have practical applications in many fields. For instance, anomaly detection allows one to filter and clean the data used to train machine learning algorithms, improving their performance. However, outlier mining is challenging when the data are high-dimensional, and different approaches have been proposed for different types of data (temporal, spatial, network, etc.). Here we propose a methodology to mine outliers in generic datasets in which it is possible to define a meaningful distance between elements of the dataset. The methodology is based on defining a fully connected, undirected graph, where the nodes are the elements of the dataset and the links have weights that are the distances between the nodes. Outlier scores are defined by analyzing the structure of the graph, in particular by using the Jensen–Shannon (JS) divergence to compare the distributions of weights of different nodes. We demonstrate the method using a publicly available database of credit-card transactions, where some of the transactions are labeled as frauds. We compare with the performance obtained when using Euclidean distances and graph percolation, and show that the JS divergence leads to a performance improvement, but increases the computational cost.
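    A toy version of this idea (illustrative only; the paper's exact scoring and the credit-card experiment are not reproduced here) compares each node's histogram of distances to all other points against the pooled distance histogram, via the JS divergence:

    ```python
    import numpy as np

    def js_divergence(p, q, eps=1e-12):
        """Jensen-Shannon divergence (base 2) between two histograms."""
        p = np.asarray(p, float) / np.sum(p)
        q = np.asarray(q, float) / np.sum(q)
        m = 0.5 * (p + q)
        kl = lambda a, b: np.sum(a * np.log2((a + eps) / (b + eps)))
        return 0.5 * kl(p, m) + 0.5 * kl(q, m)

    def outlier_scores(X, bins=20):
        """Score each row of X by how much its distance distribution
        deviates from the pooled one (higher = more anomalous)."""
        n = len(X)
        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        edges = np.linspace(0.0, D.max(), bins + 1)
        pooled, _ = np.histogram(D[np.triu_indices(n, 1)], bins=edges)
        scores = np.empty(n)
        for i in range(n):
            h, _ = np.histogram(np.delete(D[i], i), bins=edges)  # drop self-distance
            scores[i] = js_divergence(h, pooled)
        return scores
    ```

    A point far from a tight cluster concentrates its distance histogram in the large-distance bins, so its JS divergence from the pooled histogram, and hence its score, is largest.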

    Information Theory Perspective on Network Robustness

    A crucial challenge in network theory is the study of the robustness of a network after it faces a sequence of failures. In this work, we propose a dynamical definition of network robustness based on Information Theory that considers measurements of the structural changes caused by failures of the network's components. Failures are defined here as a temporal process, given as a sequence. The robustness of the network is then evaluated by measuring dissimilarities between topologies after each time step of the sequence, providing dynamical information about the topological damage. We thoroughly analyze the efficiency of the method in capturing small perturbations by considering both the degree and the distance distributions. We find the network's distance distribution more consistent in capturing structural deviations, as it better reflects the consequences of the failures. Theoretical examples and real networks are used to study the performance of this methodology. Comment: 5 pages, 2 figures, submitted
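    A minimal sketch of the idea, under the simplifying assumptions that failures knock out whole nodes and that the degree distribution (rather than the distance distribution the abstract ultimately favors) is compared after each step:

    ```python
    import numpy as np

    def degree_dist(A, kmax):
        """Normalized degree histogram of adjacency matrix A, over 0..kmax."""
        deg = A.sum(axis=1).astype(int)
        h = np.bincount(deg, minlength=kmax + 1).astype(float)
        return h / h.sum()

    def js_div(p, q, eps=1e-12):
        """Jensen-Shannon divergence (base 2) between two distributions."""
        m = 0.5 * (p + q)
        kl = lambda a, b: np.sum(a * np.log2((a + eps) / (b + eps)))
        return 0.5 * kl(p, m) + 0.5 * kl(q, m)

    def robustness_trace(A, failure_order):
        """sqrt-JS dissimilarity between the degree distribution after each
        node failure and the original one, along a failure sequence."""
        A = A.copy()
        kmax = int(A.sum(axis=1).max())
        p0 = degree_dist(A, kmax)
        trace = []
        for v in failure_order:
            A[v, :] = 0  # node v fails: all its edges disappear
            A[:, v] = 0
            trace.append(float(np.sqrt(js_div(degree_dist(A, kmax), p0))))
        return trace
    ```

    On a star graph, losing a leaf perturbs the degree distribution mildly, while losing the hub afterwards destroys it entirely, so the trace jumps accordingly.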

    Diffusion capacity of single and interconnected networks

    Understanding diffusive processes in networks is a significant challenge in complexity science. Networks possess a diffusive potential that depends on their topological configuration, but diffusion also relies on the process and initial conditions. This article presents Diffusion Capacity, a concept that measures a node's potential to diffuse information based on a distance distribution that considers both geodesic and weighted shortest paths and dynamical features of the diffusion process. Diffusion Capacity thoroughly describes the role of individual nodes during a diffusion process and can identify structural modifications that may improve diffusion mechanisms. The article defines Diffusion Capacity for interconnected networks and introduces Relative Gain, which compares the performance of a node in a single structure versus an interconnected one. The method applies to a global climate network constructed from surface air temperature data, revealing a significant change in diffusion capacity around the year 2000, suggesting a loss of the planet's diffusion capacity that could contribute to the emergence of more frequent climatic events. Research partially supported by Brazilian agencies FAPEMIG, CAPES, and CNPq. P.M.P. acknowledges support from the "Paul and Heidi Brown Preeminent Professorship in ISE, University of Florida", RSF 14-41-00039, the Humboldt Research Award (Germany), and LATNA, Higher School of Economics, RF. C.M. acknowledges partial support from Spanish MINECO (PID2021-123994NB-C21) and ICREA ACADEMIA. A.D.-G. acknowledges support from the Spanish grants PGC2018-094754-BC22 and PID2021-128005NB-C22, funded by MCIN/AEI/10.13039/501100011033 and "ERDF A way of making Europe", and from Generalitat de Catalunya (2021SGR00856). M.G.R. acknowledges partial support from FUNDEP.

    Causality and the Entropy-Complexity Plane: Robustness and Missing Ordinal Patterns

    We deal here with the issue of determinism versus randomness in time series: one wishes to identify their relative weights in a given time series. Two different tools have been advanced in the literature to such effect, namely, i) the "causal" entropy-complexity plane [Rosso et al., Phys. Rev. Lett. 99 (2007) 154102] and ii) the estimation of the decay rate of missing ordinal patterns [Amigó et al., Europhys. Lett. 79 (2007) 50001, and Carpi et al., Physica A 389 (2010) 2020-2029]. In this work we extend the use of these techniques to the analysis of deterministic finite time series contaminated with additive noises of different degrees of correlation. The chaotic series studied here was generated by the logistic map (r = 4), to which we added correlated noise (colored noise with f⁻ᔏ power spectrum, 0 ≀ k ≀ 2) of varying amplitudes. In such a fashion, important insights pertaining to the deterministic component of the original time series can be gained. We find that in the entropy-complexity plane this goal can be achieved without additional computations. Comment: submitted to Physica
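    The ordinal-pattern machinery behind the entropy-complexity plane reduces, in its simplest form, to counting the order patterns of short windows of the series. A sketch of the normalized permutation entropy, one axis of that plane (function names are ours):

    ```python
    import itertools
    import math
    import numpy as np

    def ordinal_distribution(x, d=3):
        """Relative frequency of each ordinal pattern of dimension d in x."""
        counts = {p: 0 for p in itertools.permutations(range(d))}
        for i in range(len(x) - d + 1):
            # the pattern of a window is the ranking of its d values
            counts[tuple(np.argsort(x[i:i + d]))] += 1
        total = len(x) - d + 1
        return {p: c / total for p, c in counts.items()}

    def permutation_entropy(x, d=3):
        """Normalized permutation entropy: 0 for a monotone series,
        close to 1 for white noise."""
        probs = np.array([v for v in ordinal_distribution(x, d).values() if v > 0])
        return float(-np.sum(probs * np.log(probs)) / math.log(math.factorial(d)))
    ```

    The statistical complexity used for the second axis combines this entropy with a disequilibrium term (a JS-type distance to the uniform pattern distribution), which is omitted here for brevity.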

    Amyotrophic Lateral Sclerosis Multiprotein Biomarkers in Peripheral Blood Mononuclear Cells

    Amyotrophic lateral sclerosis (ALS) is a fatal progressive motor neuron disease for which there is still no diagnostic/prognostic test or therapy. Specific molecular biomarkers are urgently needed to facilitate clinical studies and speed up the development of effective treatments. We used a two-dimensional difference in-gel electrophoresis approach to identify, in easily accessible clinical samples (peripheral blood mononuclear cells, PBMC), a panel of protein biomarkers that are closely associated with ALS. Validations and a longitudinal study were performed by immunoassays on a selected number of proteins. The same proteins were also measured in PBMC and spinal cord of a G93A SOD1 transgenic rat model. We identified combinations of protein biomarkers that can distinguish, with high discriminatory power, ALS patients from healthy controls (98%) and from patients with neurological disorders that may resemble ALS (91%), and between two levels of disease severity (90%), as well as a number of translational biomarkers that link responses between the human and animal models. We demonstrated that TDP-43, cyclophilin A and ERp57 associate with disease progression in a longitudinal study. Moreover, the protein profile changes detected in peripheral blood mononuclear cells of ALS patients are suggestive of possible intracellular pathogenic mechanisms such as endoplasmic reticulum stress, nitrative stress, and disturbances in redox regulation and RNA processing. Our results indicate that PBMC multiprotein biomarkers could contribute to determining ALS diagnosis, differential diagnosis, disease severity and progression, and may help to elucidate pathogenic mechanisms.

    31st Annual Meeting and Associated Programs of the Society for Immunotherapy of Cancer (SITC 2016) : part two

    Background: The immunological escape of tumors represents one of the main obstacles to the treatment of malignancies. The blockade of PD-1 or CTLA-4 receptors represented a milestone in the history of immunotherapy. However, immune checkpoint inhibitors seem to be effective only in specific cohorts of patients. It has been proposed that their efficacy relies on the presence of an immunological response. Thus, we hypothesized that disruption of the PD-L1/PD-1 axis would synergize with our oncolytic vaccine platform PeptiCRAd. Methods: We used murine B16OVA in vivo tumor models and flow cytometry analysis to investigate the immunological background. Results: First, we found that high-burden B16OVA tumors were refractory to combination immunotherapy. However, with a more aggressive schedule, tumors with a lower burden were more susceptible to the combination of PeptiCRAd and PD-L1 blockade. The therapy significantly increased the median survival of mice (Fig. 7). Interestingly, the reduced growth of contralaterally injected B16F10 cells suggested the presence of a long-lasting immunological memory also against non-targeted antigens. Concerning the functional state of tumor-infiltrating lymphocytes (TILs), we found that all the immune therapies enhanced the percentage of activated (PD-1pos TIM-3neg) T lymphocytes and reduced the amount of exhausted (PD-1pos TIM-3pos) cells compared to placebo. As expected, we found that PeptiCRAd monotherapy could increase the number of antigen-specific CD8+ T cells compared to other treatments. However, only the combination with PD-L1 blockade could significantly increase the ratio between activated and exhausted pentamer-positive cells (p = 0.0058), suggesting that by disrupting the PD-1/PD-L1 axis we could decrease the amount of dysfunctional antigen-specific T cells. We observed that the anatomical location deeply influenced the state of CD4+ and CD8+ T lymphocytes. In fact, TIM-3 expression was increased 2-fold on TILs compared to splenic and lymphoid T cells. In the CD8+ compartment, the expression of PD-1 on the surface seemed to be restricted to the tumor micro-environment, while CD4+ T cells had a high expression of PD-1 also in lymphoid organs. Interestingly, we found that the levels of PD-1 were significantly higher on CD8+ T cells than on CD4+ T cells in the tumor micro-environment (p < 0.0001). Conclusions: We demonstrated that the efficacy of immune checkpoint inhibitors might be strongly enhanced by their combination with cancer vaccines. PeptiCRAd was able to increase the number of antigen-specific T cells, and PD-L1 blockade prevented their exhaustion, resulting in long-lasting immunological memory and increased median survival.

    Missing ordinal patterns in correlated noises

    Recent research aiming at the distinction between deterministic and stochastic behavior in observational time series has looked into the properties of the "ordinal patterns". In particular, new insight has been obtained by considering the emergence of the so-called "forbidden ordinal patterns". It was shown that deterministic one-dimensional maps always have forbidden ordinal patterns, in contrast with time series generated by an unconstrained stochastic process, in which all the patterns appear with probability one. Techniques based on comparing this property in an observational time series and in white Gaussian noise have been implemented. However, the comparison with correlated stochastic processes was not considered. In this paper we use the concept of "missing ordinal patterns" to study their decay rate as a function of the time series length in three stochastic processes with different degrees of correlation: fractional Brownian motion, fractional Gaussian noise, and noises with f⁻ᔏ power spectrum. We show that the decay rate of "missing ordinal patterns" in these processes depends on their correlation structures. We finally discuss the implications of the present results for the use of these properties as a tool for distinguishing deterministic from stochastic processes.
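    Counting missing patterns is straightforward to sketch. As the deterministic reference we use the logistic map at r = 4, for which, at dimension 3, the strictly decreasing triple is a known forbidden pattern, while sufficiently long white noise exhibits all patterns (the generators below are our own illustration, not the paper's fBm/fGn setup):

    ```python
    import math
    import numpy as np

    def missing_patterns(x, d=3):
        """Number of ordinal patterns of dimension d never observed in x."""
        seen = set()
        for i in range(len(x) - d + 1):
            seen.add(tuple(np.argsort(x[i:i + d])))
        return math.factorial(d) - len(seen)

    def logistic_series(n, x0=0.4, r=4.0, transient=100):
        """Trajectory of the logistic map x -> r x (1 - x), after a transient."""
        x = x0
        for _ in range(transient):
            x = r * x * (1.0 - x)
        out = np.empty(n)
        for i in range(n):
            x = r * x * (1.0 - x)
            out[i] = x
        return out
    ```

    The decay rate studied in the paper is obtained by evaluating `missing_patterns` on prefixes of increasing length and watching how quickly the count falls.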

    Dynamics of climate networks

    A methodology to analyze dynamical changes in climate systems based on complex networks and Information Theory quantifiers is discussed. In particular, the square root of the Jensen–Shannon divergence, a measure of dissimilarity between two probability distributions, is used to quantify states in the network evolution process by means of their degree distributions. We explore the evolution of the surface air temperature (SAT) climate network in the Tropical Pacific region. We find that the proposed quantifier is able not only to capture changes in the dynamics of the studied process but also to quantify and compare states in its evolution. The dynamic network topology is investigated for temporal windows of one-year duration over the 1948–2009 period. This novel methodology allows us to consistently compare the evolving network topologies and to capture a cyclic behavior consistent with that of El Niño/Southern Oscillation. This cyclic behavior involves alternating states of less/more efficient information transfer during El Niño/La Niña years, respectively, reflecting a higher climatic stability for La Niña years, in agreement with current observations. The study also detects a change in the dynamics of the network structure that coincides with the 1976/77 climate shift, after which conditions of less-efficient information transfer are more frequent and intense.

    Entropy analysis of the dynamics of El Niño/Southern Oscillation during the Holocene

    This study explores temporal changes in the dynamics of the Holocene ENSO proxy record of the Laguna Pallcacocha sedimentary data using two entropy quantifiers. In particular, we analyze possible connections between changes in entropy and epochs of rapid climate change (RCC). Our results indicate that the dynamics of the ENSO proxy record during the RCC interval 9000–8000 BP displays very low entropy (high predictability), remarkably different from that of the other RCCs of the Holocene. Both entropy quantifiers point to the existence of cycles with a period close to 2000 years during the mid-to-late Holocene. Within these cycles, we find a tendency for entropy to increase (predictability to decrease) during the two longer RCC periods (6000–5000 and 3500–2500 BP), which might be associated with the reported increased aridity of the low tropics.